
    Turtle Geometry

    Turtle Geometry presents an innovative program of mathematical discovery that demonstrates how the effective use of personal computers can profoundly change the nature of a student's contact with mathematics. Using this book and a few simple computer programs, students can explore the properties of space by following an imaginary turtle across the screen. The concept of turtle geometry grew out of the Logo Group at MIT. Directed by Seymour Papert, author of Mindstorms, this group has done extensive work with preschool children, high school students, and university undergraduates.
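
    As an informal illustration of the turtle idiom described above (the book's own examples are written in Logo, so this Python sketch using the standard library's turtle module is only an analogue), a closed figure can be drawn by repeating a fixed forward step and turn:

        import turtle

        def poly(steps, side, angle):
            """Draw a figure by repeatedly moving forward and turning left."""
            for _ in range(steps):
                turtle.forward(side)   # move the turtle along its current heading
                turtle.left(angle)     # rotate the heading by a fixed angle

        if __name__ == "__main__":
            poly(steps=6, side=80, angle=60)  # a hexagon: 6 turns of 60 degrees = 360
            turtle.done()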

    MIT’s Strategy for Educational Technology Innovation, 1999–2003

    This paper discusses the institutional framework and the strategic decisions that led to the launch of several major educational technology initiatives at MIT between 1999 and 2003. It describes how MIT’s central administration provided strategic support and coordination for large educational technology programs, and it traces how strategies evolved as work progressed through 2003 to a point where major projects had been launched and were ready to proceed as ongoing concerns. The history recounted here provides a snapshot of a world-class university confronting the changing environment for higher education engendered by information technology at the beginning of the 21st century.

    Structure and Interpretation of Computer Programs

    Structure and Interpretation of Computer Programs has had a dramatic impact on computer science curricula over the past decade. This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.
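
    The contrast between objects with state and lazy streams that the revision emphasizes can be sketched outside of Scheme as well; the following Python analogue (ours, not the book's code) shows a stateful account whose result depends on the order of calls, alongside a lazily generated stream of integers:

        import itertools

        class Account:
            """An object with state: each withdrawal changes what later calls see."""
            def __init__(self, balance):
                self.balance = balance
            def withdraw(self, amount):
                self.balance -= amount
                return self.balance

        def integers(start=1):
            """A lazy 'stream' of integers, produced only as they are demanded."""
            n = start
            while True:
                yield n
                n += 1

        acct = Account(100)
        print(acct.withdraw(30), acct.withdraw(30))   # 70 40 -- order of events matters
        print(list(itertools.islice(integers(), 5)))  # [1, 2, 3, 4, 5] -- demand-driven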

    Factors Affecting the Adoption of Faculty-Developed Academic Software: A Study of Five iCampus Projects

    Instruction in higher education must adapt more rapidly to changes in workforce needs, global issues, advances in disciplines, and resource constraints. The pace of such improvement depends on the speed with which new ideas and materials are adopted across institutions. In 1999 Microsoft pledged $25 million and staff support for iCampus, a seven-year MIT project to develop pioneering uses of educational technology. The TLT Group studied five iCampus projects in order to identify factors affecting institutionalization and widespread dissemination. Among the factors impeding adoption were a lack of rewards and support for faculty to adopt innovations, faculty isolation, and a lack of attention to adoption issues among projects selected for funding. The study made recommendations for universities, foundations, government agencies, and corporations: 1) continue making education more authentic, active, collaborative, and feedback-rich; 2) create demand to adopt ideas and materials from other sources by encouraging all faculty members to improve and document learning in their programs, year after year; 3) nurture coalitions for instructional improvement, across and within institutions; 4) create more effective higher education corporate alliances; and 5) improve institutional services to support faculty in educational design, software development, assessment methods, formative evaluation, and/or in sharing ideas with others who teach comparable courses.

    Amorphous Computing

    Amorphous computing is the development of organizational principles and programming languages for obtaining coherent behaviors from the cooperation of myriads of unreliable parts that are interconnected in unknown, irregular, and time-varying ways. The impetus for amorphous computing comes from developments in microfabrication and fundamental biology, each of which is the basis of a kernel technology that makes it possible to build or grow huge numbers of almost-identical information-processing units at almost no cost. This paper sets out a research agenda for realizing the potential of amorphous computing and surveys some initial progress, both in programming and in fabrication. We describe some approaches to programming amorphous systems, which are inspired by metaphors from biology and physics. We also present the basic ideas of cellular computing, an approach to constructing digital-logic circuits within living cells by representing logic levels by concentrations of DNA-binding proteins.
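
    As a rough illustration of the kind of local-rule programming this agenda calls for (the sketch is ours, not an example from the paper), the following Python snippet builds a hop-count gradient: randomly placed cells that can only talk to nearby neighbors cooperatively estimate their distance, in hops, from a chosen source cell:

        import random, math

        N, RADIUS = 400, 0.08
        cells = [(random.random(), random.random()) for _ in range(N)]
        neighbors = [
            [j for j, q in enumerate(cells)
             if j != i and math.dist(p, q) < RADIUS]   # local communication only
            for i, p in enumerate(cells)
        ]

        hops = [None] * N
        hops[0] = 0                      # cell 0 plays the role of the source
        frontier = [0]
        while frontier:                  # values spread one communication hop per round
            nxt = []
            for i in frontier:
                for j in neighbors[i]:
                    if hops[j] is None:
                        hops[j] = hops[i] + 1
                        nxt.append(j)
            frontier = nxt

        print(max(h for h in hops if h is not None), "hops to the farthest reachable cell")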

    Ethnic and gender differences in help seeking for substance disorders among Black Americans

    This paper uses the National Survey of American Life (NSAL) to examine within-group differences in help-seeking for substance disorders among a US sample of African American and Caribbean Black men and women. We examined ethnic and gender differences in the type of providers sought for substance disorder treatment, as well as reasons for avoiding treatment. Results indicate that overall, few ethnic differences exist; however, African Americans are more likely than Caribbean Blacks to seek help from human service professionals (including a religious or spiritual advisor) and from informal sources of treatment such as self-help groups. Black men with a substance disorder were more likely to see a psychiatrist than Black women. Findings regarding reasons for avoiding treatment suggest that there may be a need to provide better education about the utility of substance disorder treatment, even before problems reach a high level of severity. This work was supported by National Institute on Drug Abuse training grant #T32DA007267 and National Institute of Mental Health training grant #T32 MH16806-25. The NSAL is supported by the National Institute of Mental Health (grant U01-MH57716) with supplemental support from the Office of Behavioral and Social Science Research at the National Institutes of Health and the University of Michigan.

    The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2

    Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non-vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment. In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
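
    For readers unfamiliar with the risk score cited above, the bracketed expansion in the abstract corresponds to the standard CHA2DS2-VASc point assignment, sketched below in Python; the function and its argument names are illustrative and are not taken from the registry's analysis code:

        def cha2ds2_vasc(chf, hypertension, age, diabetes, prior_stroke,
                         vascular_disease, female):
            """Standard CHA2DS2-VASc point assignment (illustrative helper only)."""
            score = 0
            score += 1 if chf else 0                              # Congestive heart failure
            score += 1 if hypertension else 0                     # Hypertension
            score += 2 if age >= 75 else (1 if age >= 65 else 0)  # Age bands
            score += 1 if diabetes else 0                         # Diabetes mellitus
            score += 2 if prior_stroke else 0                     # Previous stroke/TIA
            score += 1 if vascular_disease else 0                 # Vascular disease
            score += 1 if female else 0                           # Sex category
            return score

        # A score of 2 or more corresponds to the "high stroke risk" group above.
        print(cha2ds2_vasc(False, True, 72, False, False, False, True))  # -> 3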

    The Bifurcation Interpreter: A Step Towards the Automatic Analysis of Dynamical Systems

    The Bifurcation Interpreter is a computer program that autonomously explores the steady-state orbits of one-parameter families of periodically driven oscillators. To report its findings, the Interpreter generates schematic diagrams and English text descriptions similar to those appearing in the science and engineering research literature. Given a system of equations as input, the Interpreter uses symbolic algebra to automatically generate numerical procedures that simulate the system. The Interpreter incorporates knowledge about dynamical systems theory, which it uses to guide the simulations, to interpret the results, and to minimize the effects of numerical error.
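
    The numerical task the Interpreter automates can be sketched in Python (without the symbolic-algebra and interpretation machinery the abstract describes) by integrating a periodically driven, damped oscillator and sampling its state once per forcing period, so that a period-k steady-state orbit shows up as k repeating sample points:

        import math

        def step_rk4(f, t, y, h):
            # One fourth-order Runge-Kutta step for y' = f(t, y), with y a list of floats.
            k1 = f(t, y)
            k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
            k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
            k4 = f(t + h,   [yi + h*ki   for yi, ki in zip(y, k3)])
            return [yi + h/6*(a + 2*b + 2*c + d)
                    for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

        def driven_pendulum(t, y):
            # Damped pendulum with sinusoidal forcing (illustrative parameter values).
            damping, drive, omega = 0.5, 0.9, 2.0/3.0
            theta, v = y
            return [v, -damping*v - math.sin(theta) + drive*math.cos(omega*t)]

        period = 2*math.pi/(2.0/3.0)             # one forcing period
        steps = 400
        y, t, section = [0.1, 0.0], 0.0, []
        for n in range(300):                     # integrate over 300 forcing periods
            for _ in range(steps):
                y = step_rk4(driven_pendulum, t, y, period/steps)
                t += period/steps
            if n >= 280:                         # discard transients, sample the tail
                theta = (y[0] + math.pi) % (2*math.pi) - math.pi
                section.append((round(theta, 2), round(y[1], 2)))
        print(sorted(set(section)))  # few distinct points suggest a periodic steady state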

    Towards a Theory of Local and Global in Computation

    We formulate the rudiments of a method for assessing the difficulty of dividing a computational problem into "independent simpler parts." This work illustrates measures of complexity which attempt to capture the distinction between "local" and "global" computational problems. One such measure is the covering multiplicity, or average number of partial computations which take account of a given piece of data. Another measure reflects the intuitive notion of a "highly interconnected" computational problem, for which subsets of the data cannot be processed "in isolation." These ideas are applied in the setting of computational geometry to show that the connectivity predicate has unbounded covering multiplicity and is highly interconnected; and in the setting of numerical computations to measure the complexity of evaluating polynomials and solving systems of linear equations.
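
    One way to read the abstract's informal definition of covering multiplicity in symbols (the notation here is ours, not necessarily the paper's): if a computation on a data set D is divided into partial computations P_1, ..., P_k, then

        \mu(P_1,\dots,P_k) \;=\; \frac{1}{|D|} \sum_{d \in D} \#\{\, i : P_i \text{ takes account of } d \,\},

    the average, over pieces of data, of the number of partial computations that examine each piece; the abstract reports that the connectivity predicate makes this quantity unbounded.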